Search results for: Discrete-time neural networks (DNNs)

Number of results: 2,505,214

K. Meenakshi, M. Syed Ali, M. Usha, N. Gunasekaran

This paper focuses on the problem of finite-time boundedness and finite-time passivity of discrete-time T-S fuzzy neural networks with time-varying delays. A suitable Lyapunov-Krasovskii functional (LKF) is constructed to derive sufficient conditions for finite-time passivity of discrete-time T-S fuzzy neural networks. The dynamical system is transformed into a T-S fuzzy model with uncertain par...
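For orientation, a generic sketch (not the specific model or functional of this paper): a discrete-time delayed neural network is commonly written as
\[ x(k+1) = A\,x(k) + B\,f(x(k)) + C\,f(x(k-\tau(k))) + u(k), \]
and a typical Lyapunov-Krasovskii candidate augments a quadratic term with a delay summation,
\[ V(k) = x(k)^{\top} P\,x(k) + \sum_{i=k-\tau(k)}^{k-1} x(i)^{\top} Q\,x(i), \qquad P, Q \succ 0. \]
Finite-time boundedness with respect to $(c_1, c_2, N, R)$ then requires that $x(0)^{\top} R\,x(0) \le c_1$ imply $x(k)^{\top} R\,x(k) < c_2$ for $k = 1, \dots, N$.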

2014
Ryu Takeda, Naoyuki Kanda, Nobuo Nukaga

This paper presents boundary contraction training for acoustic models based on deep neural networks in a discrete system (discrete DNNs). Representing DNN parameters with a small number of bits (such as 4 bits) can reduce not only memory usage but also computational complexity, by using the CPU cache and look-up tables efficiently. However, simply quantizing the parameters of the normal con...
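As a minimal illustration of the storage side of this idea (a sketch only, not the boundary contraction method itself; the helper below is hypothetical), uniform 4-bit quantization stores each weight as a 4-bit index into a small look-up table:

```python
import numpy as np

def quantize_uniform(w, n_bits=4):
    """Uniformly quantize a weight tensor to n_bits (here 2**4 = 16 levels).
    Returns integer codes and the codebook (a look-up table of real values)."""
    n_levels = 2 ** n_bits
    lo, hi = w.min(), w.max()
    step = (hi - lo) / (n_levels - 1)
    codes = np.round((w - lo) / step).astype(np.uint8)   # 4-bit indices
    codebook = lo + step * np.arange(n_levels)           # LUT: index -> value
    return codes, codebook

# Usage: store `codes` (4 bits per weight) plus the tiny codebook,
# and dequantize on the fly through the look-up table.
w = np.random.randn(256, 128).astype(np.float32)
codes, lut = quantize_uniform(w, n_bits=4)
w_hat = lut[codes]                                       # reconstructed weights
print(np.abs(w - w_hat).max() <= (lut[1] - lut[0]) / 2 + 1e-6)
```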

Journal: CoRR, 2017
Lei Deng, Peng Jiao, Jing Pei, Zhenzhi Wu, Guoqi Li

Although deep neural networks (DNNs) are a revolutionary force opening up the AI era, their notoriously large hardware overhead has challenged their applications. Recently, several binary and ternary networks, in which the costly multiply-accumulate operations can be replaced by accumulations or even binary logic operations, have made the on-chip training of DNNs quite promising. Therefore there i...
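To make the arithmetic point concrete (a generic binary-weight sketch under common conventions, not the specific scheme of this paper), constraining weights to ±1 with a single per-layer scale turns each inner product into additions and subtractions of the inputs:

```python
import numpy as np

def binarize(w):
    """Deterministic binarization: keep only the sign of each weight,
    plus one per-layer scale (a common choice in binary-network papers)."""
    alpha = np.abs(w).mean()            # per-layer scaling factor
    return np.where(w >= 0, 1.0, -1.0), alpha

def binary_linear(x, w_bin, alpha, b=None):
    """Forward pass with binarized weights: each inner product uses only
    additions/subtractions of the inputs, scaled once by alpha."""
    y = alpha * (x @ w_bin)             # +/-1 weights -> accumulate-only arithmetic
    return y if b is None else y + b

x = np.random.randn(8, 64)
w = np.random.randn(64, 32)
w_bin, alpha = binarize(w)
print(binary_linear(x, w_bin, alpha).shape)   # (8, 32)
```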

Journal: I. J. Bifurcation and Chaos, 2004
Guanrong Chen, Jin Zhou, Zengrong Liu

This paper formulates the model and then studies the dynamics of a system of linearly and diffusively coupled identical delayed neural networks (DNNs), which is a generalization of delayed Hopfield neural networks (DHNNs) and delayed cellular neural networks (DCNNs). In particular, a simple yet generic sufficient condition for global synchronization of such coupled DNNs is derived based on the L...
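A rough sketch of this class of models (generic notation, not the paper's exact equations): each node is a delayed neural network and the nodes are coupled linearly and diffusively,
\[ \dot{x}_i(t) = -C\,x_i(t) + A\,f(x_i(t)) + B\,f(x_i(t-\tau)) + I + \sum_{j=1}^{N} g_{ij}\,\Gamma\,x_j(t), \qquad i = 1,\dots,N, \]
where the coupling matrix $G=(g_{ij})$ has zero row sums (diffusive coupling) and $\Gamma$ is the inner coupling matrix; global synchronization means $\lim_{t\to\infty}\lVert x_i(t)-x_j(t)\rVert = 0$ for all $i,j$.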

Thesis: Ministry of Science, Research and Technology - Tarbiat Modares University - Faculty of Humanities, 1389

Rivers and runoff have always been of interest to human beings. In order to make use of available water resources, human societies, industrial and agricultural centers, etc. have usually been established near rivers. As time went on, these societies developed, and water resources were therefore extracted more and more. Consequently, the water quality of the rivers experienced rap...

2016
Aaron Klein, Stefan Falkner, Jost Tobias Springenberg, Frank Hutter

The performance of deep neural networks (DNNs) crucially relies on good hyperparameter settings. Since the computational expense of training DNNs renders traditional black-box optimization infeasible, recent advances in Bayesian optimization model the performance of iterative methods as a function of time, so as to adaptively allocate more resources to promising hyperparameter settings. Here, we propos...
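As a loose illustration of the underlying idea (not the method proposed in this abstract; the power-law model and helper below are assumptions made for the example), one can fit a parametric curve to the partial validation-error trace of each hyperparameter setting and keep training only the settings whose extrapolated final error looks promising:

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(t, a, b, c):
    # simple saturating learning-curve model: error decays as a power law
    return c + a * np.power(t, -b)

def predicted_final_error(errors, horizon):
    """Fit a power law to a partial validation-error curve and
    extrapolate it to the full training horizon."""
    t = np.arange(1, len(errors) + 1, dtype=float)
    (a, b, c), _ = curve_fit(power_law, t, errors, p0=(1.0, 0.5, 0.1), maxfev=5000)
    return power_law(horizon, a, b, c)

# Toy example: two hyperparameter settings observed for 10 of 100 epochs.
curve_good = 0.05 + 0.9 * np.arange(1, 11, dtype=float) ** -0.7
curve_bad  = 0.30 + 0.5 * np.arange(1, 11, dtype=float) ** -0.3
for name, c in [("good", curve_good), ("bad", curve_bad)]:
    print(name, predicted_final_error(c, horizon=100))
# Keep training the setting with the lower predicted final error.
```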

2017
Jinxi Guo, Ning Xu, Li-Jia Li, Abeer Alwan

Recently, neural networks with deep architectures have been widely applied to acoustic scene classification. Both Convolutional Neural Networks (CNNs) and Long Short-Term Memory networks (LSTMs) have shown improvements over fully connected Deep Neural Networks (DNNs). Motivated by the fact that CNNs, LSTMs and DNNs are complementary in their modeling capabilities, we apply CLDNNs (Convolutiona...
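As a toy sketch of what such a combined stack can look like (assumed dimensions and layer sizes, written in PyTorch for illustration; not the architecture used in this paper), a CLDNN-style model applies a convolutional front end to time-frequency features, feeds the result to an LSTM, and classifies with fully connected layers:

```python
import torch
import torch.nn as nn

class CLDNN(nn.Module):
    """Toy CLDNN-style stack: convolutional front end over time-frequency
    features, a recurrent (LSTM) layer, then fully connected layers."""
    def __init__(self, n_mels=40, n_classes=15):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d((1, 2)),                       # pool along frequency only
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d((1, 2)),
        )
        self.lstm = nn.LSTM(input_size=32 * (n_mels // 4),
                            hidden_size=128, batch_first=True)
        self.fc = nn.Sequential(nn.Linear(128, 128), nn.ReLU(),
                                nn.Linear(128, n_classes))

    def forward(self, x):                        # x: (batch, time, n_mels)
        x = x.unsqueeze(1)                       # (batch, 1, time, n_mels)
        x = self.conv(x)                         # (batch, 32, time, n_mels // 4)
        x = x.permute(0, 2, 1, 3).flatten(2)     # (batch, time, 32 * n_mels // 4)
        x, _ = self.lstm(x)
        return self.fc(x[:, -1])                 # classify from the last time step

print(CLDNN()(torch.randn(2, 100, 40)).shape)    # torch.Size([2, 15])
```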

2016
Timothy J. Draelos, Nadine E. Miner, Jonathan A. Cox, Christopher C. Lamb, Conrad D. James, James B. Aimone

Deep neural networks (DNNs) have achieved remarkable success on complex data processing tasks. In contrast to biological neural systems, which are capable of learning continuously, DNNs have a limited ability to incorporate new information into a trained network. Therefore, methods for continuous learning are potentially highly impactful in enabling the application of DNNs to dynamic data sets. Inspired by...

2016
Patrick McClure, Nikolaus Kriegeskorte

As deep neural networks (DNNs) are applied to increasingly challenging problems, they will need to be able to represent their own uncertainty. Modeling uncertainty is one of the key features of Bayesian methods. Scalable Bayesian DNNs that use dropout-based variational distributions have recently been proposed. Here we evaluate the ability of Bayesian DNNs trained with Bernoulli or Gaussian dis...
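A small sketch of the dropout-based approach mentioned here (Bernoulli dropout only; the Gaussian variant and the paper's exact setup are not reproduced): keeping dropout active at prediction time and averaging several stochastic forward passes yields an approximate predictive distribution, the usual "MC dropout" view of a Bayesian DNN.

```python
import torch
import torch.nn as nn

# Small classifier with dropout layers (sizes are arbitrary for the example).
net = nn.Sequential(
    nn.Linear(20, 64), nn.ReLU(), nn.Dropout(p=0.5),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.5),
    nn.Linear(64, 3),
)

def mc_dropout_predict(model, x, n_samples=50):
    """Monte Carlo dropout: average n_samples stochastic forward passes."""
    model.train()                        # keep dropout ON at prediction time
    with torch.no_grad():
        probs = torch.stack([model(x).softmax(dim=-1) for _ in range(n_samples)])
    mean = probs.mean(dim=0)             # predictive mean
    std = probs.std(dim=0)               # per-class spread as an uncertainty proxy
    return mean, std

x = torch.randn(4, 20)
mean, std = mc_dropout_predict(net, x)
print(mean.shape, std.shape)             # torch.Size([4, 3]) twice
```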

Journal: CoRR, 2016
Antonio Jimeno-Yepes, Jianbin Tang

Deep Neural Networks (DNNs) have achieved human-level performance in many image analytics tasks, but DNNs are mostly deployed on GPU platforms that consume a considerable amount of power. Brain-inspired spiking neuromorphic chips consume little power and can be highly parallelized. However, for deploying DNNs to energy-efficient neuromorphic chips, the incompatibility between continuous neurons and s...
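One common way to bridge that gap, shown here only as a generic illustration (not necessarily the conversion scheme of this paper), is rate coding: an integrate-and-fire neuron driven for a fixed number of timesteps produces a spike rate that approximates a clipped ReLU activation.

```python
def if_neuron_rate(drive, T=100, v_th=1.0):
    """Integrate-and-fire neuron driven by a constant input for T steps.
    The spike count / T approximates a clipped, quantized ReLU -- the usual
    trick when converting a trained DNN to a rate-coded spiking network."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += drive
        if v >= v_th:
            spikes += 1
            v -= v_th            # reset by subtraction preserves the rate code
    return spikes / T

for a in [-0.3, 0.0, 0.25, 0.7, 1.0]:
    print(f"analog ReLU {max(a, 0.0):.2f}  ~  spike rate {if_neuron_rate(a):.2f}")
```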
